Systemantics

🚀 The Book in 3 Sentences

This book is about systems: why they exist and the rules that govern them. It dives into the fundamentals of human-made systems. Along the way it pokes fun at the tribulations of complicated systems and at how often humans fail to comprehend them.

🎨 Impressions

It was an interesting book, but some parts are harder to follow than others. The style grew a little dull toward the end, and the book was often cumbersome to work through. I never fully grasped its intricacies, and it left me doubting my knowledge of the subject. It is closely related to Thinking in Systems, which I found clearer. The subject may be hard to grasp because systems thinking is a very different way of approaching the world than most people, myself included, are used to.

✍️ My Top Quotes

  • Russians, Chinese, Americans, and Africans may differ on everything else in the world, but they all agree that whatever the problem may be, the answer lies in setting up some system to deal with it.

  • Our purpose is to help the Systems-student minimize such experiences, become aware of the many opportunities of incurring such a shock, and be mentally prepared so that the shock will at least not be unexpected.

  • A world in which the hungry nations export food, the wealthiest nations slip into demoralizing economic recessions, the strongest nations go to war against the smallest and weakest and are unable to win, a world in which revolutions against tyrannical systems themselves become tyrannies.

  • Korzybski seemed to have convinced himself that all breakdowns of human Systems are attributable to misunderstandings—in brief, to failures of communication.

  • That failure to function as expected is to be expected, and this behavior results from systems laws that are as rigorous as any in Natural Science or Mathematics.

  • NEW SYSTEMS MEAN NEW PROBLEMS

  • Dumping along the highway exemplifies the principle of Le Chatelier (The System Tends To Oppose Its Own Proper Function).

  • Systems are like babies: once you get one, you have it. They don’t go away. On the contrary, they display the most remarkable persistence. They not only persist; they grow. And as they grow, they encroach. The growth potential of Systems was explored in a tentative, preliminary way by Parkinson, who concluded that Administrative Systems maintain an average rate of growth of five to six percent per annum (corrected for inflation) regardless of the work to be done.

  • THE SYSTEM ITSELF (DAS SYSTEM AN UND FUER SICH) TENDS TO GROW AT 5-6% PER ANNUM
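
As a quick aside, the arithmetic behind that 5-6% figure is easy to check: compound growth at those rates doubles the size of a system roughly every twelve to fourteen years. A minimal sketch of the calculation (plain Python, my own illustration, not from the book):

```python
import math

# Doubling time for a quantity growing at a constant annual rate r:
#   t = ln(2) / ln(1 + r)
def doubling_time(rate):
    return math.log(2) / math.log(1 + rate)

for rate in (0.05, 0.06):
    # 5% -> ~14.2 years, 6% -> ~11.9 years
    print(f"{rate:.0%} annual growth doubles the System in {doubling_time(rate):.1f} years")
```

So a bureaucracy growing at Parkinson's rate doubles in size about every dozen years, regardless of the work to be done.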

  • In the United States, the Internal Revenue Service not only collects our taxes, it also makes us compute the tax for it, a task that produces a demonstrable shortening of both lifespan and temper.

  • Systems Expand, and as they expand, they Encroach.

  • Under precisely controlled experimental conditions, a test animal will behave as it damn well pleases.

  • Not just animal behavior, but the behavior of complex Systems generally, whether living or non-living, is unpredictable.

  • The Queen Elizabeth 2, greatest ocean liner ever built, has three separate sets of boilers for safety, reliability, and speed. Yet on a recent cruise, in fine weather and a calm sea, all three sets of boilers failed simultaneously.

  • A LARGE SYSTEM, PRODUCED BY EXPANDING THE DIMENSIONS OF A SMALLER SYSTEM, DOES NOT BEHAVE LIKE THE SMALLER SYSTEM

  • Le Chatelier’s Principle. This Law states that any natural process, whether physical or chemical, tends to set up conditions opposing the further operation of the process.

  • SYSTEMS TEND TO OPPOSE THEIR OWN PROPER FUNCTIONS

  • SYSTEMS TEND TO MALFUNCTION CONSPICUOUSLY JUST AFTER THEIR GREATEST TRIUMPH

  • PERFECTION OF PLANNING IS A SYMPTOM OF DECAY

  • The UN has responded to famine in sub-Saharan Africa by planning a magnificent new Conference Center to accommodate UN personnel meeting to discuss this problem.

  • A TEMPORARY PATCH WILL VERY LIKELY BE PERMANENT

  • THE NAME IS MOST EMPHATICALLY NOT THE THING. The Naming Fallacy is at the heart of the Grand Illusion and of the many forms of Systems-Delusions.

  • Our study of the Operational Fallacy has made clear how and why it is that

    1. large Systems really do not do what they purport to do and that
    2. people in large Systems are not actually performing the function ascribed to them. These two facts quite naturally lend an air of unreality to the entire operation of a large System.
  • Fundamental Law of Administrative Workings (F.L.A.W.): THINGS ARE WHAT THEY ARE REPORTED TO BE. The observant reader has doubtless already noted various alternative formulations of this Axiom, all to the same general effect. For example: THE REAL WORLD IS WHAT IS REPORTED TO THE SYSTEM. Or, in the world of Diplomacy: IF IT ISN’T OFFICIAL, IT HASN’T HAPPENED. Amongst television journalists this Axiom takes the following form: IF IT DIDN’T HAPPEN ON CAMERA, IT DIDN’T HAPPEN. Mind-boggling as it may seem, the converse proposition is also true.

  • THE BIGGER THE SYSTEM, THE NARROWER AND MORE SPECIALIZED THE INTERFACE WITH INDIVIDUALS

  • In very large Systems, the relationship is never with the individual at all, but with his Social Security number, his driver’s license, or some other paper phantom derived from an extremely specialized aspect of the person.

  • Taped to the wall of the nurses’ station, just above the Vital Signs Remote Sensing Console that enables the nurses to record whether the patient is breathing and even to take his pulse without actually going down the hall to see him, was the following hand-lettered reminder: THE CHART IS NOT THE PATIENT

  • These observations lead naturally to enunciation of the Jet Travel Paradox: WHEN YOU GET THERE, YOU’RE STILL NOT THERE

  • Manager’s Mirage. The belief that some event (usually called an “outcome”) was actually caused by the operation of the System. Examples: The Public School System is obviously responsible for the literary works of Faulkner, Hemingway, and Arthur Miller, since it taught them to write.

  • We generalize: THE SYSTEM TAKES THE CREDIT (FOR ANY FAVORABLE EVENTUALITY)

  • Having thoroughly digested this introduction, we should have no trouble understanding that: IF A SYSTEM CAN BE EXPLOITED, IT WILL BE. Nor will we cavil at its twin: ANY SYSTEM CAN BE EXPLOITED

  • “I never ruled Russia. Ten thousand clerks ruled Russia.” Thus spoke the Czar Alexander on his deathbed.

  • IF A SYSTEM IS WORKING, LEAVE IT ALONE. DON’T CHANGE ANYTHING

  • Charlemagne, for example, in his desire to be fair to his three sons, divided his empire among them—an act that gave rise to France, Germany, and Alsace-Lorraine, and to more than a thousand years of strife.

  • LARGE COMPLEX SYSTEMS ARE BEYOND HUMAN CAPACITY TO EVALUATE

  • As these examples suggest, it is probably wise to err on the side of simplicity. Practically speaking, any System with more than two elements should probably be regarded as complex, at least for purposes of human interaction.

  • SYSTEMS DEVELOP GOALS OF THEIR OWN THE INSTANT THEY COME INTO BEING. Furthermore, it seems axiomatically clear that: INTRASYSTEM GOALS COME FIRST. More subjectively stated: SYSTEMS DON’T WORK FOR YOU OR FOR ME. THEY WORK FOR THEIR OWN GOALS

  • The reader who masters this powerful Axiom can readily comprehend why the United Nations recently suspended its efforts at dealing with questions of detente, the Middle East, and the drought in North Africa in order to spend an entire day debating whether UN employees should continue to ride first-class on airplanes.

  • We, therefore, retreat from metaphysics to the simple, down-to-earth attitude: the Purpose of the System is—whatever it can be used for.

  • ANY LARGE SYSTEM IS GOING TO BE OPERATING MOST OF THE TIME IN FAILURE MODE

  • Our basic approach is indicated in the Fundamental Failure Theorem (F.F.T.): A SYSTEM CAN FAIL IN AN INFINITE NUMBER OF WAYS. An extreme example is the government of Haiti, which, with one exception, consists entirely of Departments that do not function. Dozens of national and international aid agencies, frustrated by the inability of the Haitian Government to cope with outside assistance, have sent emergency representatives to Haiti, to teach the government officials how to fill out requests for aid.

  • WHEN A FAIL-SAFE SYSTEM FAILS, IT FAILS BY FAILING TO FAIL SAFE —Nuclear strategists please note!

  • The student is invited to notice that the anti-meltdown device did not fail while performing its function as a fail-safe device. It failed during normal operations and in so doing it failed to fail safely; in fact, it caused the very accident it was designed to deal with.

  • AS SYSTEMS GROW IN SIZE AND COMPLEXITY, THEY TEND TO LOSE BASIC FUNCTIONS

  • Thus, the loss of 50,000 American lives per year in auto accidents is seen, not as a mortal flaw in our Transportation System, but merely as a fact of life.

  • In the field of Nuclear Weaponry, the power to exterminate one’s enemy (and concomitantly oneself) ten times over is regarded as not quite enough.

  • Medical Science, in its studies of the relationship between heart disease and fats in the bloodstream, for twenty years focused its attention on the wrong fats.

  • In the field of Human Behavior, an enormous research effort is being directed to proving that Mental Illness is a disease of the brain. This conviction appears to stem at least in part from a Systems-delusion, namely: IF IT’S TREATED BY DOCTORS IT MUST BE A DISEASE

  • IN SETTING UP A NEW SYSTEM, TREAD SOFTLY. YOU MAY BE DISTURBING ANOTHER SYSTEM THAT IS ACTUALLY WORKING

  • EXPERIENCE ISN’T HEREDITARY—IT AIN’T EVEN CONTAGIOUS

  • IT IS IMPOSSIBLE TO NOT COMMUNICATE Think of the eighteen-inch tsetse fly without any explanatory lecture. It continues to tell a story, even though the story is different for each viewer. Or think, as Bateson reminded us, of “the letter which you do not write. . . the income tax form which you do not fill in . . .” which nevertheless elicit vigorous responses.

  • THE MEANING OF A COMMUNICATION IS THE BEHAVIOR THAT RESULTS This Axiom, which flies in the face of vulgar Common Sense, is basic. It is a nettle that must be grasped, and the sooner the better. Simply put: Are we willing to subject our communications to the test of actual outcomes?

  • THE MOST URGENTLY NEEDED INFORMATION DECAYS FASTEST

  • For most of us, however, the Decay Rate of Information remains merely an interesting abstraction, for our efforts at Coping are limited by an even more drastic law, the Inaccessibility Theorem: THE INFORMATION YOU HAVE IS NOT THE INFORMATION YOU WANT. THE INFORMATION YOU WANT IS NOT THE INFORMATION YOU NEED. THE INFORMATION YOU NEED IS NOT THE INFORMATION YOU CAN OBTAIN.

  • We summarize in the Rule of Thumb for missing information: DON’T BOTHER TO LOOK FOR IT. YOU WON’T FIND IT. It will turn up later—when you no longer need it.

  • DO IT WITHOUT A NEW SYSTEM IF YOU CAN

  • Two immediate Corollaries, with significant implications for Management, are as follows:

  1. DO IT WITH AN EXISTING SYSTEM IF YOU CAN
  2. DO IT WITH A SMALL SYSTEM IF YOU CAN
  • At this point one should be mindful of Agnes Allen’s Law: ALMOST ANYTHING IS EASIER TO GET INTO THAN OUT OF. More specifically: TAKING IT DOWN IS OFTEN MORE TEDIOUS THAN SETTING IT UP

  • In human terms, this means working with human tendencies rather than against them. For example, a State-run lottery flourishes even in times of economic depression because its function is aligned with the basic human instinct to gamble a small stake in hopes of a large reward.

  • The Public School System, on the other hand, although founded with the highest and most altruistic goals in mind, remains in a state of chronic failure because it violates the principle of spontaneity in human learning. It goes against the grain and therefore it does not ever really succeed. It has made literacy universal, but not truly popular.

  • LOOSE SYSTEMS LAST LONGER AND FUNCTION BETTER. Since most of modern life is lived in the interstices of large systems, it is of practical importance to note that LOOSE SYSTEMS HAVE LARGER INTERSTICES and are therefore generally somewhat less hostile to human life forms than tighter Systems.

  • Potemkin Village Effect. The P.V.E. is especially pronounced in Five-Year Plans, which typically report sensational overachievement during the first four and a half years, followed by a rash of criminal trials of top officials and the announcement of a new and more ambitious Five-Year Plan, starting from a baseline somewhat lower than that of the preceding Plan, but with higher goals.

  • The complexity consultants can no more predict the future than the clients can. Mindful of Chaos Theory, we propose our own Emendation: THE FUTURE IS NO MORE PREDICTABLE NOW THAN IT WAS IN THE PAST, BUT YOU CAN AT LEAST TAKE NOTE OF TRENDS

  • Kaiser Wilhelm of Germany, when still a boy, on a visit to his uncle, the King of England, saw the mighty English fleet and promptly dreamed of having one of his own, like Uncle Bertie’s, only bigger.

  • Systems can do many things, but one thing they emphatically cannot do is to solve Problems. A System represents someone’s solution to a Problem. The System itself does not solve Problems.

  • Legend has it that one of the lesser-known museums of Middle Eastern Archeology contains an ancient baked brick from the city of Ur of the Chaldees upon which, five thousand years ago, a scribe had incised in cuneiform symbols the cryptic message: THE FINAL TRUTH IS JUST AROUND THE CORNER. Although the original author of this insight is unknown, the belief remains alive to this day, being widely held with all the fixity of Revealed Religion, that WHEN THE CURRENT REVISION IS COMPLETE, THE SYSTEM WILL BE PERFECT. Alternatively: PERFECTION CAN BE ACHIEVED ON THE DAY AFTER THE FINAL DEADLINE. Needless to say, this idea, no matter how elegantly formulated, remains a Delusion. The truth of the matter is summarized in the Perfectionist’s Paradox: IN DEALING WITH LARGE SYSTEMS, THE STRIVING FOR PERFECTION IS A SERIOUS IMPERFECTION

  • IF IT’S WORTH DOING AT ALL, IT’S WORTH DOING POORLY

  • But even today, one still occasionally hears a slogan, asserted as if it were a genuine Systems-Axiom, to the effect that: IF YOU ARE NOT PART OF THE SOLUTION, YOU ARE PART OF THE PROBLEM. —Catchy but specious. The correct form of the Theorem is as follows: THE SOLUTION IS OFTEN PART OF THE PROBLEM

  • We have encountered this process before. It is our old friend, Positive Feedback. IF THINGS SEEM TO BE GETTING WORSE EVEN FASTER THAN USUAL, CONSIDER THAT THE REMEDY MAY BE AT FAULT —or, more succinctly: STAY OUT OF THE POSITIVE FEEDBACK TRAP This phenomenon will be referred to as the Nasal Spray Effect in tribute to the millions of hay fever and “sinus” sufferers the world over who use nasal sprays to shrink their stuffy noses, only to discover that the rebound stuffiness that occurs when the spray wears off is worse than the original stuffiness.

  • ESCALATING THE WRONG SOLUTION DOES NOT IMPROVE THE OUTCOME —or more briefly, that the Nasal Spray Effect cannot be cured by using more nasal spray.

  • IF THINGS ARE ACTING VERY STRANGELY, CONSIDER THAT YOU MAY BE IN A FEEDBACK SITUATION

  • Furthermore, an uncanny element of Paradox is prominent in the few examples so far reported. Thus, the long survival of the British Monarchy is probably attributable to the fact that the King reigns but does not rule.

  • What has been said so far could be summarized in one deceptively simple Rule of Thumb: IF YOU CAN’T CHANGE THE SYSTEM, CHANGE THE FRAME—IT COMES TO THE SAME THING But a word of warning is in order. The novice Reframer, having discovered the hammer, so to speak, is likely to consider that almost everything needs hammering. This tendency should be resisted. We do not deny that occasionally one may encounter—or, even more happily, initiate—a successful Reframing. But we insist resolutely upon the Reality Principle, which states that such occurrences are the exception, not the rule. Tempered, moderate pessimism is the hallmark of the seasoned Systems-student.

  • Fundamental Theorem: NEW SYSTEMS GENERATE NEW PROBLEMS. Corollary (Occam’s Razor): SYSTEMS SHOULD NOT BE UNNECESSARILY MULTIPLIED. Law of Conservation of Anergy: THE TOTAL AMOUNT OF ANERGY IN THE UNIVERSE IS CONSTANT. Corollary: SYSTEMS OPERATE BY REDISTRIBUTING ANERGY INTO DIFFERENT FORMS AND INTO ACCUMULATIONS OF DIFFERENT SIZES.

  • The F.L.A.W. (Fundamental Law of Administrative Workings): THINGS ARE WHAT THEY ARE REPORTED TO BE. Alternative Forms of the F.L.A.W.: THE REAL WORLD IS WHAT IS REPORTED TO THE SYSTEM. IF IT ISN’T OFFICIAL, IT HASN’T HAPPENED. IF IT DIDN’T HAPPEN ON CAMERA, IT DIDN’T HAPPEN. And Conversely: IF THE SYSTEM SAYS IT HAPPENED, IT HAPPENED. Corollary #1: A SYSTEM IS NO BETTER THAN ITS SENSORY ORGANS. Corollary #2: TO THOSE WITHIN A SYSTEM THE OUTSIDE REALITY TENDS TO PALE AND DISAPPEAR. Corollary #3: THE BIGGER THE SYSTEM, THE NARROWER AND MORE SPECIALIZED THE INTERFACE WITH INDIVIDUALS. Harte’s Haunting Theorem: INFORMATION RARELY LEAKS UP. Memory Joggers: THE CHART IS NOT THE PATIENT. THE DOSSIER IS NOT THE PERSON